Alternating minimization for dictionary learning with random initialization

Authors

  • Niladri S. Chatterji
  • Peter L. Bartlett
Abstract

We present theoretical guarantees for an alternating minimization algorithm for the dictionary learning/sparse coding problem. The dictionary learning problem is to factorize vector samples y_1, y_2, ..., y_n into an appropriate basis (dictionary) A* and sparse vectors x*_1, ..., x*_n. Our algorithm is a simple alternating minimization procedure that switches between ℓ1 minimization and gradient descent in alternate steps. Dictionary learning, and specifically alternating minimization algorithms for dictionary learning, are well studied both theoretically and empirically. However, in contrast to previous theoretical analyses of this problem, we replace the condition on the operator norm (that is, the largest-magnitude singular value) of the true underlying dictionary A* with a condition on the matrix infinity norm (that is, the largest-magnitude entry). This not only allows us to obtain convergence rates for the error of the estimated dictionary measured in the matrix infinity norm, but also ensures that a random initialization will provably converge to the global optimum. Our guarantees hold under a reasonable generative model that allows for dictionaries with growing operator norms, and can handle an arbitrary level of overcompleteness, while having sparsity that is information-theoretically optimal. We also establish upper bounds on the sample complexity of our algorithm.
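To make the two alternating steps concrete, here is a minimal synthetic sketch in Python (NumPy + scikit-learn): each iteration sparse-codes every sample via an ℓ1-regularized least-squares (lasso) problem, then takes one gradient descent step on the dictionary, starting from a random initialization. The generative model, dimensions, step size `eta`, penalty `lam`, and iteration counts are illustrative assumptions, not the parameters analyzed in the paper, and scikit-learn's Lasso stands in for the paper's ℓ1-minimization step.

```python
# A minimal sketch of alternating minimization for dictionary learning.
# Assumptions (not from the paper): a synthetic model Y = A* X* with
# s-sparse codes, scikit-learn's Lasso as the l1 solver, and step size /
# penalty / iteration counts chosen purely for illustration.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
d, r, n, s = 20, 30, 500, 3   # signal dim, dictionary atoms, samples, sparsity

# Ground-truth dictionary with unit-norm columns, and s-sparse codes.
A_star = rng.standard_normal((d, r))
A_star /= np.linalg.norm(A_star, axis=0)
X_star = np.zeros((r, n))
for i in range(n):
    support = rng.choice(r, size=s, replace=False)
    X_star[support, i] = rng.standard_normal(s)
Y = A_star @ X_star

# Random initialization of the dictionary estimate.
A = rng.standard_normal((d, r))
A /= np.linalg.norm(A, axis=0)

eta, lam = 0.5, 1e-3          # gradient step size and lasso penalty (assumed)
for _ in range(50):
    # Step 1: l1 minimization -- sparse-code each sample against current A.
    lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
    X = np.column_stack([lasso.fit(A, Y[:, i]).coef_ for i in range(n)])
    # Step 2: one gradient descent step on the least-squares loss in A.
    A -= eta * (A @ X - Y) @ X.T / n

# The paper measures error entrywise (matrix infinity norm, after matching
# columns up to permutation and sign); here we just report the residual.
print("relative residual:", np.linalg.norm(Y - A @ X) / np.linalg.norm(Y))
```

In this sketch the per-sample lasso step plays the role of the ℓ1-minimization step and the single update on A mirrors the gradient descent step; the paper's analysis additionally controls the entrywise error ||A − A*||_∞ along such iterations.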


Similar resources

Analysis of Fast Alternating Minimization for Structured Dictionary Learning

Methods exploiting sparsity have been popular in imaging and signal processing applications including compression, denoising, and imaging inverse problems. Data-driven approaches such as dictionary learning enable one to discover complex image features from datasets and provide promising performance over analytical models. Alternating minimization algorithms have been particularly popular in di...


Global optimization of factor models and dictionary learning using alternating minimization

Learning new representations in machine learning is often tackled using a factorization of the data. For many such problems, including sparse coding and matrix completion, learning these factorizations can be difficult, both in terms of efficiency and in guaranteeing that the solution is a global minimum. Recently, a general class of objectives has been introduced, called induced regularized factor mo...


A fast algorithm for general matrix factorization

Matrix factorization algorithms are emerging as popular tools in many applications, especially dictionary learning methods for recovering biomedical image data from noisy and ill-conditioned measurements. We introduce a novel dictionary learning algorithm based on an augmented Lagrangian (AL) approach to learn dictionaries from exemplar data; it can be extended to general matrix factorization pr...


Learning Sparsely Used Overcomplete Dictionaries

We consider the problem of learning sparsely used overcomplete dictionaries, where each observation is a sparse combination of elements from an unknown overcomplete dictionary. We establish exact recovery when the dictionary elements are mutually incoherent. Our method consists of a clustering-based initialization step, which provides an approximate estimate of the true dictionary with guarante...





Publication date: 2017